Results 1 - 10 of 10
1.
Sensors (Basel) ; 24(3)2024 Feb 04.
Article in English | MEDLINE | ID: mdl-38339732

ABSTRACT

Traditional systems for indoor pressure sensing and human activity recognition (HAR) rely on costly, high-resolution mats and computationally intensive neural network-based (NN-based) models that are prone to noise. In contrast, we design a cost-effective and noise-resilient pressure mat system for HAR, leveraging Velostat for intelligent pressure sensing and a novel hyperdimensional computing (HDC) classifier that is lightweight and highly noise resilient. To measure the performance of our system, we collected two datasets, capturing the static and continuous nature of human movements. Our HDC-based classification algorithm achieves 93.19% accuracy, a 9.47% improvement over state-of-the-art CNNs, along with an 85% reduction in energy consumption. We propose a new noise-resilient HDC algorithm and analyze its performance in the presence of three kinds of noise: memory and communication noise, input noise, and sensor noise. Our system is more resilient than the CNN baselines across all three noise types. Specifically, in the presence of Gaussian noise, we achieve an accuracy of 92.15% (97.51% for static data), representing a 13.19% (8.77%) improvement compared to state-of-the-art CNNs.
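The abstract does not spell out the classifier's internals; as a rough illustration of how an HDC classifier of this general kind operates, here is a minimal sketch. The dimensionality, random bipolar basis, sign-quantized bundling, and the toy "sit"/"stand" pressure features are illustrative assumptions, not the authors' design:

```python
import random

D = 1000  # hypervector dimensionality (real systems often use ~10,000)
random.seed(0)

def random_hv():
    # Random bipolar hypervector of +1/-1 components.
    return [random.choice((-1, 1)) for _ in range(D)]

def encode(sample, basis):
    # Encode a feature vector as an intensity-weighted sum of per-feature
    # basis hypervectors, then binarize by sign (bipolar quantization).
    acc = [0.0] * D
    for value, hv in zip(sample, basis):
        for i in range(D):
            acc[i] += value * hv[i]
    return [1 if a >= 0 else -1 for a in acc]

def similarity(a, b):
    # Normalized dot product; cosine similarity for bipolar vectors.
    return sum(x * y for x, y in zip(a, b)) / D

def train(samples_by_class, basis):
    # Bundle encoded training samples into one prototype per class
    # by majority sign -- no gradient descent involved.
    prototypes = {}
    for label, samples in samples_by_class.items():
        acc = [0] * D
        for s in samples:
            hv = encode(s, basis)
            for i in range(D):
                acc[i] += hv[i]
        prototypes[label] = [1 if a >= 0 else -1 for a in acc]
    return prototypes

def classify(sample, prototypes, basis):
    # Predict the class whose prototype is most similar to the query.
    hv = encode(sample, basis)
    return max(prototypes, key=lambda lbl: similarity(hv, prototypes[lbl]))
```

Because training is just bundling and inference is a handful of similarity checks over bitwise-friendly vectors, this style of classifier is far cheaper than a backprop-trained CNN, which is the efficiency argument the abstract makes.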


Subjects
Algorithms, Neural Networks (Computer), Humans, Noise, Human Activities, Movement
2.
Bioinformatics ; 39(7)2023 07 01.
Article in English | MEDLINE | ID: mdl-37369033

ABSTRACT

MOTIVATION: Driven by technological advances, the throughput and cost of mass spectrometry (MS) proteomics experiments have improved by orders of magnitude in recent decades. Spectral library searching is a common approach to annotating experimental mass spectra by matching them against large libraries of reference spectra corresponding to known peptides. An important disadvantage, however, is that only peptides included in the spectral library can be found, whereas novel peptides, such as those with unexpected post-translational modifications (PTMs), will remain unknown. Open modification searching (OMS) is an increasingly popular approach to annotate modified peptides based on partial matches against their unmodified counterparts. Unfortunately, this leads to very large search spaces and excessive runtimes, which is especially problematic considering the continuously increasing sizes of MS proteomics datasets. RESULTS: We propose an OMS algorithm, called HOMS-TC, that fully exploits parallelism in the entire pipeline of spectral library searching. We designed a new highly parallel encoding method based on the principle of hyperdimensional computing to encode mass spectral data to hypervectors while minimizing information loss. This process can be easily parallelized since each dimension is calculated independently. HOMS-TC processes the two stages of the existing cascade search in parallel and selects the most similar spectra while considering PTMs. We accelerate HOMS-TC on NVIDIA's tensor core units, which are emerging and readily available in recent graphics processing units (GPUs). Our evaluation shows that HOMS-TC is 31× faster on average than alternative search engines while providing accuracy comparable to competing search tools. AVAILABILITY AND IMPLEMENTATION: HOMS-TC is freely available under the Apache 2.0 license as an open-source software project at https://github.com/tycheyoung/homs-tc.
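HOMS-TC's actual encoder is not reproduced here; the following toy sketch only illustrates the property the abstract relies on: each hypervector dimension is a pure, independent function of the input spectrum, so all dimensions can be computed in parallel (one GPU thread per dimension). The peak representation, the hash-derived sign function, and the vector width are assumptions for illustration:

```python
import hashlib

D = 256  # hypervector width (illustrative; real systems use thousands)

def peak_sign(mz_bin, dim):
    # Pseudo-random +/-1 derived deterministically from (m/z bin, dimension),
    # so any single dimension can be recomputed without touching the others.
    h = hashlib.blake2b(f"{mz_bin}:{dim}".encode(), digest_size=2).digest()
    return 1 if h[0] & 1 else -1

def encode_dim(spectrum, dim):
    # One output dimension: sign of the intensity-weighted sum of
    # per-peak signs. Depends only on the spectrum and this dimension.
    s = sum(inten * peak_sign(mz_bin, dim) for mz_bin, inten in spectrum)
    return 1 if s >= 0 else 0

def encode_spectrum(spectrum):
    # Each iteration is independent, so this loop maps directly onto
    # massively parallel hardware.
    return [encode_dim(spectrum, d) for d in range(D)]

def hamming_similarity(a, b):
    # Fraction of agreeing bits; cheap to evaluate in bulk on GPUs.
    return sum(x == y for x, y in zip(a, b)) / len(a)
```

Spectra sharing their dominant peaks land on nearly identical binary hypervectors, while unrelated spectra agree on only about half the bits, which is what makes approximate matching against a large library fast.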


Subjects
Software, Tandem Mass Spectrometry, Tandem Mass Spectrometry/methods, Protein Databases, Peptides/chemistry, Search Engine, Algorithms, Peptide Library
3.
J Proteome Res ; 22(6): 1639-1648, 2023 06 02.
Article in English | MEDLINE | ID: mdl-37166120

ABSTRACT

As current shotgun proteomics experiments can produce gigabytes of mass spectrometry data per hour, processing these massive data volumes has become progressively more challenging. Spectral clustering is an effective approach to speed up downstream data processing by merging highly similar spectra to minimize data redundancy. However, because state-of-the-art spectral clustering tools fail to achieve optimal runtimes, this merely shifts the processing bottleneck. In this work, we present a fast spectral clustering tool, HyperSpec, based on hyperdimensional computing (HDC). HDC shows promising clustering capability while requiring only lightweight, highly parallel binary operations that can be optimized using low-level hardware architectures, making it possible to run HyperSpec on graphics processing units for extremely efficient spectral clustering. Additionally, HyperSpec includes optimized data preprocessing modules to reduce the spectrum preprocessing time, a critical bottleneck in spectral clustering. Based on experiments using various mass spectrometry datasets, HyperSpec produces results with clustering quality comparable to state-of-the-art spectral clustering tools while achieving order-of-magnitude speedups, shortening the clustering runtime for over 21 million spectra from 4 h to only 24 min.
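The abstract does not describe HyperSpec's clustering algorithm itself; as a minimal illustration of why binary hypervectors make clustering cheap (similarity reduces to bitwise comparison, with no floating-point distance computations), consider this greedy single-pass sketch. The single-pass strategy and the threshold value are assumptions, not HyperSpec's method:

```python
def hamming_sim(a, b):
    # Fraction of agreeing bits between two binary hypervectors.
    return sum(x == y for x, y in zip(a, b)) / len(a)

def greedy_cluster(hvs, threshold):
    # Single-pass greedy clustering: each hypervector joins the first
    # existing cluster whose representative is similar enough;
    # otherwise it founds a new cluster.
    reps, assignments = [], []
    for hv in hvs:
        for idx, rep in enumerate(reps):
            if hamming_sim(hv, rep) >= threshold:
                assignments.append(idx)
                break
        else:
            reps.append(hv)
            assignments.append(len(reps) - 1)
    return assignments
```

On hardware, `hamming_sim` is an XOR plus a popcount, which is why this style of clustering parallelizes so well on GPUs.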


Subjects
Algorithms, Peptides, Peptides/analysis, Mass Spectrometry/methods, Proteomics/methods, Cluster Analysis
4.
Lancet Reg Health Am ; 19: 100449, 2023 Mar.
Article in English | MEDLINE | ID: mdl-36844610

ABSTRACT

Background: Schools are high-risk settings for SARS-CoV-2 transmission, but necessary for children's educational and social-emotional wellbeing. Previous research suggests that wastewater monitoring can detect SARS-CoV-2 infections in controlled residential settings with high levels of accuracy. However, its effective accuracy, cost, and feasibility in non-residential community settings are unknown. Methods: The objective of this study was to determine the effectiveness and accuracy of community-based passive wastewater and surface (environmental) surveillance to detect SARS-CoV-2 infection in neighborhood schools compared to weekly diagnostic (PCR) testing. We implemented an environmental surveillance system in nine elementary schools with 1700 regularly present staff and students in southern California. The system was validated from November 2020 to March 2021. Findings: In 447 data collection days across the nine sites, 89 individuals tested positive for COVID-19, and SARS-CoV-2 was detected in 374 surface samples and 133 wastewater samples. Ninety-three percent of identified cases were associated with an environmental sample (95% CI: 88%-98%); 67% were associated with a positive wastewater sample (95% CI: 57%-77%); and 40% were associated with a positive surface sample (95% CI: 29%-52%). The techniques we utilized allowed for near-complete genomic sequencing of wastewater and surface samples. Interpretation: Passive environmental surveillance can detect the presence of COVID-19 cases in non-residential community school settings with a high degree of accuracy. Funding: County of San Diego, Health and Human Services Agency, National Institutes of Health, National Science Foundation, Centers for Disease Control.
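The reported intervals are consistent with a normal-approximation (Wald) confidence interval on a binomial proportion over the 89 identified cases; the interval method actually used by the authors, and the inferred count of 83 positive cases behind the 93% figure, are assumptions for this sketch:

```python
import math

def wald_ci(successes, n, z=1.96):
    # Normal-approximation (Wald) 95% confidence interval
    # for a binomial proportion.
    p = successes / n
    half = z * math.sqrt(p * (1 - p) / n)
    return p, p - half, p + half

# Assuming the reported 93% corresponds to 83 of the 89 cases
# (an inferred count, not stated in the abstract):
p, lo, hi = wald_ci(83, 89)
print(f"{p:.0%} (95% CI: {lo:.0%}-{hi:.0%})")  # prints "93% (95% CI: 88%-98%)"
```

Running the same calculation with 60/89 and 36/89 lands close to the other two reported intervals, which supports, but does not prove, that a normal-approximation interval was used.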

5.
medRxiv ; 2023 Jan 25.
Article in English | MEDLINE | ID: mdl-34704096

ABSTRACT

Background: Schools are high-risk settings for SARS-CoV-2 transmission, but necessary for children's educational and social-emotional wellbeing. Previous research suggests that wastewater monitoring can detect SARS-CoV-2 infections in controlled residential settings with high levels of accuracy. However, its effective accuracy, cost, and feasibility in non-residential community settings are unknown. Methods: The objective of this study was to determine the effectiveness and accuracy of community-based passive wastewater and surface (environmental) surveillance to detect SARS-CoV-2 infection in neighborhood schools compared to weekly diagnostic (PCR) testing. We implemented an environmental surveillance system in nine elementary schools with 1700 regularly present staff and students in southern California. The system was validated from November 2020 to March 2021. Findings: In 447 data collection days across the nine sites, 89 individuals tested positive for COVID-19, and SARS-CoV-2 was detected in 374 surface samples and 133 wastewater samples. Ninety-three percent of identified cases were associated with an environmental sample (95% CI: 88%-98%); 67% were associated with a positive wastewater sample (95% CI: 57%-77%); and 40% were associated with a positive surface sample (95% CI: 29%-52%). The techniques we utilized allowed for near-complete genomic sequencing of wastewater and surface samples. Interpretation: Passive environmental surveillance can detect the presence of COVID-19 cases in non-residential community school settings with a high degree of accuracy. Funding: County of San Diego, Health and Human Services Agency, National Institutes of Health, National Science Foundation, Centers for Disease Control.

6.
IEEE J Biomed Health Inform ; 27(1): 202-214, 2023 01.
Article in English | MEDLINE | ID: mdl-36136930

ABSTRACT

Recent years have seen growing interest in leveraging deep learning models to monitor epilepsy patients based on electroencephalographic (EEG) signals. However, these approaches often generalize poorly when applied outside the setting in which the training data were collected. Furthermore, manual labeling of EEG signals is a time-consuming process requiring expert analysis, making the fine-tuning of patient-specific models to new settings a costly proposition. In this work, we propose the Maximum-Mean-Discrepancy Decoder (M2D2) for automatic temporal localization and labeling of seizures in long EEG recordings to assist medical experts. We show that M2D2 achieves F1-scores of 76.0% and 70.4% for temporal localization when evaluated on EEG data gathered in a different clinical setting than the training data. The results demonstrate that M2D2 yields substantially higher generalization performance than other state-of-the-art deep learning-based approaches.
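The abstract does not describe M2D2's architecture, but the statistic in its name, maximum mean discrepancy (MMD), is standard: it measures how far apart two sample distributions are in a kernel feature space. A minimal sketch of the biased MMD² estimator follows; the Gaussian kernel and its bandwidth are conventional choices assumed here, not details taken from the paper:

```python
import math

def rbf(x, y, sigma=1.0):
    # Gaussian (RBF) kernel between two feature vectors.
    d2 = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-d2 / (2 * sigma ** 2))

def mmd2(X, Y, sigma=1.0):
    # Biased estimator of squared Maximum Mean Discrepancy:
    # MMD^2 = E[k(x, x')] + E[k(y, y')] - 2 E[k(x, y)].
    def mean_k(A, B):
        return sum(rbf(a, b, sigma) for a in A for b in B) / (len(A) * len(B))
    return mean_k(X, X) + mean_k(Y, Y) - 2 * mean_k(X, Y)
```

MMD² is near zero when the two samples come from the same distribution and grows as they diverge, which is what makes it useful for detecting a shift between training-setting and deployment-setting EEG feature distributions.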


Subjects
Epilepsy, Humans, Seizures, Electroencephalography/methods, Brain, Algorithms
7.
Front Neurosci ; 16: 867192, 2022.
Article in English | MEDLINE | ID: mdl-35706689

ABSTRACT

Brain-inspired hyperdimensional (HD) computing is a novel and efficient computing paradigm. However, highly parallel architectures such as Processing-in-Memory (PIM) are bottlenecked by required reduction operations such as accumulation. To reduce this bottleneck of HD computing in PIM, we present Stochastic-HD, which combines the simplicity of operations in Stochastic Computing (SC) with the complex task-solving capabilities of the latest HD computing algorithms. Stochastic-HD leverages deterministic SC, which enables all HD operations to be performed as highly parallel bitwise operations and removes all reduction operations, thus improving the throughput of PIM. To this end, we propose an in-memory hardware design for Stochastic-HD that exploits its high level of parallelism and robustness to approximation. Our hardware uses in-memory bitwise operations along with associative memory-like operations to enable a fast and energy-efficient implementation. With Stochastic-HD, we reach accuracy comparable to Baseline-HD. Furthermore, with an integrated Stochastic-HD retraining approach, Stochastic-HD reduces the accuracy loss to just 0.3%. We additionally accelerate the retraining process in our hardware design to create an end-to-end accelerator for Stochastic-HD. Finally, we add support for HD clustering to Stochastic-HD, making it the first work to map HD clustering operations to the stochastic domain. Compared to the best PIM design for HD, Stochastic-HD is also 4.4% more accurate and 43.1× more energy-efficient.
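The hardware mapping is not detailed in the abstract, but the core idea of deterministic stochastic computing, that arithmetic becomes bitwise logic on bitstreams, can be sketched as follows. The unary encoding and clock-division pairing shown here are one common deterministic-SC scheme, assumed for illustration rather than taken from the paper:

```python
def to_stream(p, n):
    # Unary-style bitstream for a value p in [0, 1]:
    # the first round(p * n) of n bits are set.
    ones = round(p * n)
    return [1] * ones + [0] * (n - ones)

def sc_multiply(pa, pb, n=16):
    # Deterministic stochastic multiplication: pair every bit of one
    # stream with every bit of the other (clock division) and AND them.
    # The fraction of ones in the result is exactly pa * pb when
    # pa*n and pb*n are integers; no reduction tree is needed
    # beyond a final popcount.
    a, b = to_stream(pa, n), to_stream(pb, n)
    out = [ai & bi for ai in a for bi in b]
    return sum(out) / len(out)
```

Because every output bit is a single AND gate's worth of work, this maps naturally onto in-memory bitwise operations, which is the throughput argument the abstract makes for PIM.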

8.
J Chem Theory Comput ; 18(7): 4047-4069, 2022 Jul 12.
Article in English | MEDLINE | ID: mdl-35710099

ABSTRACT

Atomistic Molecular Dynamics (MD) simulations give researchers the ability to model biomolecular structures such as proteins and their interactions with drug-like small molecules at greater spatiotemporal resolution than is otherwise possible using experimental methods. MD simulations are notoriously expensive computational endeavors that have traditionally required massive investment in specialized hardware to access biologically relevant spatiotemporal scales. We summarize the fundamental algorithms employed in the literature and then highlight the challenges that have affected accelerator implementations in practice. We consider three broad categories of accelerators: Graphics Processing Units (GPUs), Field-Programmable Gate Arrays (FPGAs), and Application-Specific Integrated Circuits (ASICs). These categories are studied comparatively to facilitate discussion of their relative trade-offs and to give context for the current state of the art. We conclude by providing insights into the potential of emerging hardware platforms and algorithms for MD.
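The time-stepping loop that all of these accelerators ultimately execute is a symplectic integrator, most commonly velocity Verlet. A minimal one-particle, one-dimensional sketch follows; the harmonic force used in the example is an illustrative stand-in, not a biomolecular force field:

```python
def velocity_verlet(x, v, force, mass, dt, steps):
    # Velocity Verlet, the symplectic integrator at the heart of most
    # MD engines (shown in 1-D for a single particle).
    f = force(x)
    for _ in range(steps):
        x += v * dt + 0.5 * (f / mass) * dt * dt   # position update
        f_new = force(x)                           # force at new position
        v += 0.5 * (f + f_new) / mass * dt         # velocity half-kicks
        f = f_new
    return x, v

# Harmonic potential F = -k*x as a stand-in for a real force field:
k, m = 1.0, 1.0
x, v = velocity_verlet(1.0, 0.0, lambda x: -k * x, m, dt=0.01, steps=1000)
```

For this oscillator the total energy 0.5*m*v² + 0.5*k*x² stays close to its initial value of 0.5 over the whole run, illustrating the long-term energy stability that makes symplectic integrators the default in MD and that accelerator designs must preserve.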


Subjects
Algorithms, Molecular Dynamics Simulation, Computers
9.
mSystems ; 7(2): e0137821, 2022 04 26.
Article in English | MEDLINE | ID: mdl-35293792

ABSTRACT

Increasing data volumes on high-throughput sequencing instruments such as the NovaSeq 6000 lead to long computational bottlenecks for common metagenomics data preprocessing tasks such as adaptor and primer trimming and host removal. Here, we test whether faster, recently developed computational tools (Fastp and Minimap2) can replace widely used choices (Atropos and Bowtie2), obtaining dramatic accelerations with additional sensitivity and minimal loss of specificity for these tasks. Furthermore, the taxonomic tables resulting from downstream processing provide biologically comparable results. However, we demonstrate that for taxonomic assignment, Bowtie2's specificity is still required. We suggest that periodic reevaluation of pipeline components, together with improvements to standardized APIs for chaining them together, will greatly enhance the efficiency of common bioinformatics tasks while also facilitating the incorporation of further optimized steps running on GPUs, FPGAs, or other architectures. We also note that a detailed exploration of available algorithms and pipeline components is an important step to take before optimizing less efficient algorithms on advanced or nonstandard hardware. IMPORTANCE In shotgun metagenomics studies that seek to relate changes in microbial DNA across samples, processing the data on a computer often takes longer than obtaining the data from the sequencing instrument. Recently developed software packages that perform individual steps in the data-processing pipeline in principle offer speed advantages, but in practice they may contain pitfalls that prevent their use; for example, they may make approximations that introduce unacceptable errors in the data. Here, we show that different choices of these components can speed up overall data processing by 5-fold or more on the same hardware while maintaining a high degree of correctness, greatly reducing the time taken to interpret results. This is an important step toward using the data in clinical settings, where the time taken to obtain results may be critical for guiding treatment.


Subjects
Metagenomics, Software, Metagenomics/methods, Algorithms, High-Throughput Nucleotide Sequencing/methods, Computational Biology/methods
10.
Annu Int Conf IEEE Eng Med Biol Soc ; 2020: 536-540, 2020 07.
Article in English | MEDLINE | ID: mdl-33018045

ABSTRACT

Recent years have seen growing interest in the development of non-invasive seizure-detection devices that can be worn in everyday life. Such devices must be lightweight and unobtrusive, which severely limits their on-board computing power and battery life. In this paper, we propose a novel technique based on hyperdimensional (HD) computing to detect epileptic seizures from 2-channel surface EEG recordings. The proposed technique eliminates the need for the complicated feature extraction required by conventional ML algorithms. The HD algorithm is also simple to implement and does not require the expert knowledge needed for the architectural optimization of neural network-based approaches. In addition, our proposed technique is lightweight and meets the computation and memory constraints of ultra-small devices. Experimental results on a publicly available dataset indicate that our approach improves accuracy compared to state-of-the-art techniques while consuming less or comparable power.


Subjects
Electroencephalography, Epilepsy, Algorithms, Epilepsy/diagnosis, Humans, Neural Networks (Computer), Seizures/diagnosis